Real-Time Estimation of Fast Egomotion with Feature Classification Using Compound Omnidirectional Vision Sensor
Abstract
For fast egomotion of a camera, computing feature correspondence and motion parameters by global search becomes highly time-consuming. Therefore, the complexity of the estimation needs to be reduced for real-time applications. In this paper, we propose a compound omnidirectional vision sensor and an algorithm for estimating its fast egomotion. The proposed sensor has both multiple baselines and a large field of view (FOV). Our method uses the multi-baseline stereo vision capability to classify feature points as near or far features. After the classification, we can estimate the camera rotation and translation separately by using random sample consensus (RANSAC) to reduce the computational complexity. The large FOV also improves robustness, since translation and rotation are clearly distinguished. To date, there has been no work on combining multi-baseline stereo with large-FOV characteristics for estimation, even though these characteristics are individually important in improving egomotion estimation. Experiments showed that the proposed method is robust and produces reasonable accuracy in real time for fast motion of the sensor.
Key words: compound omnidirectional vision, multi-baseline stereo, large FOV, motion parameter separation, fast egomotion estimation, RANSAC
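The pipeline the abstract describes — classify features by stereo disparity, then estimate rotation from the far set with RANSAC — can be sketched as below. This is a minimal illustration under assumed interfaces (matched unit bearing vectors, an illustrative disparity threshold); none of the names or values come from the paper.

```python
import numpy as np

def classify_by_disparity(disp, thresh=1.0):
    """Split feature indices into (near, far) sets.

    Far features (small multi-baseline disparity, effectively at
    infinity) are insensitive to translation, so they constrain
    rotation alone; near features can then recover translation once
    the rotation is removed. The threshold is illustrative.
    """
    disp = np.asarray(disp, dtype=float)
    return np.where(disp >= thresh)[0], np.where(disp < thresh)[0]

def kabsch(a, b):
    """Best-fit rotation R with b ≈ R @ a for matched unit bearings (rows)."""
    U, _, Vt = np.linalg.svd(a.T @ b)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T

def ransac_rotation(a, b, iters=200, tol=1e-3, seed=None):
    """RANSAC over 2-point samples: each minimal sample fixes a rotation."""
    rng = np.random.default_rng(seed)
    best_R, best_inl = np.eye(3), np.arange(0)
    for _ in range(iters):
        idx = rng.choice(len(a), size=2, replace=False)
        R = kabsch(a[idx], b[idx])
        err = np.linalg.norm(b - a @ R.T, axis=1)
        inl = np.where(err < tol)[0]
        if len(inl) > len(best_inl):
            best_inl = inl
            best_R = kabsch(a[inl], b[inl])  # refit on all inliers
    return best_R, best_inl
```

Translation would then be estimated from the near features after de-rotating their bearings, with a second RANSAC stage rejecting mismatches; decoupling the two searches is what keeps the per-frame cost low.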
Similar Articles
Using Symmetry as a Feature in Panoramic Images for Mobile Robot Applications
We propose to use symmetry as a global feature for mobile robot applications in an indoor environment. Our mobile robot solely uses an omnidirectional vision sensor consisting of a digital colour video camera and a hyperbolic mirror. Thus, robust image feature extraction is required for good performance in each application. The detection of symmetry is an effective natural vision routine result...
Reduced egomotion estimation drift using omnidirectional views
Estimation of camera motion from a given image sequence is a common task for multi-view 3D computer vision applications. Salient features (lines, corners, etc.) in the images are used to estimate the motion of the camera, also called egomotion. This estimation suffers from an error build-up as the length of the image sequence increases, causing a drift in the estimated position. In this l...
Environmental Map Generation and Egomotion Estimation in a Dynamic Environment for an Omnidirectional Image Sensor
Generation of a stationary environmental map is one of the important tasks for vision-based robot navigation. Under the assumption of known motion of a robot, environmental maps of a real scene can be successfully generated by monitoring azimuth changes in an image. Several researchers have used this property for robot navigation. However, it is difficult to observe the exact motion parameter...
A Real-Time Local Visual Feature for Omnidirectional Vision Based on FAST and CS-LBP
In this paper, a real-time local visual feature, namely FAST+CS-LBP, is proposed for omnidirectional vision. It combines the advantages of two computationally simple operators by using Features from Accelerated Segment Test (FAST) as the feature detector, and the Center-Symmetric Local Binary Patterns (CS-LBP) operator as the feature descriptor. The matching experiments of the panoramic images from ...
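As a rough illustration of the descriptor half of that pairing (the FAST detector is available off the shelf, e.g. via OpenCV's `cv2.FastFeatureDetector_create()`), here is a NumPy-only sketch of CS-LBP. The radius-1 neighbourhood, threshold value, and L1 normalisation are assumptions for this sketch, not details taken from the paper.

```python
import numpy as np

def cs_lbp_codes(patch, t=0.01):
    """4-bit Center-Symmetric LBP code per interior pixel (radius 1).

    Unlike plain LBP, each of the 4 bits compares two *opposite*
    neighbours rather than a neighbour against the centre pixel,
    which shortens the code (16 histogram bins instead of 256).
    """
    img = np.asarray(patch, dtype=float)
    h, w = img.shape
    # 8-neighbourhood offsets, ordered so offs[k] and offs[k+4] are opposite
    offs = [(-1, 0), (-1, 1), (0, 1), (1, 1),
            (1, 0), (1, -1), (0, -1), (-1, -1)]
    def shifted(dy, dx):  # neighbour plane aligned with the interior region
        return img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
    code = np.zeros((h - 2, w - 2), dtype=int)
    for k in range(4):
        n1, n2 = shifted(*offs[k]), shifted(*offs[k + 4])
        code |= ((n1 - n2) > t).astype(int) << k
    return code

def cs_lbp_descriptor(patch, t=0.01):
    """L1-normalised 16-bin histogram of CS-LBP codes over a patch."""
    hist = np.bincount(cs_lbp_codes(patch, t).ravel(), minlength=16)
    return hist / max(hist.sum(), 1)
```

In a full pipeline, each FAST keypoint would get one such histogram computed over a small patch around it, and matching reduces to comparing 16-dimensional vectors.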
Insect-Inspired Estimation of Egomotion
Tangential neurons in the fly brain are sensitive to the typical optic flow patterns generated during egomotion. In this study, we examine whether a simplified linear model based on the organization principles in tangential neurons can be used to estimate egomotion from the optic flow. We present a theory for the construction of an estimator consisting of a linear combination of optic flow vect...
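The idea of estimating egomotion as a fixed linear combination of optic-flow measurements (the matched-filter reading of the tangential-neuron model) can be sketched with ordinary least squares. The generative model, array shapes, and function names below are invented for illustration only.

```python
import numpy as np

def fit_linear_estimator(flows, egomotions):
    """Learn weights W so that flow @ W approximates the egomotion.

    flows:      (n_samples, n_flow_measurements) stacked flow readings
    egomotions: (n_samples, n_params), e.g. rotation rates (wx, wy, wz)
    Mirrors the tangential-neuron idea: each output parameter is one
    fixed linear combination (a 'matched filter') over the flow field.
    """
    W, *_ = np.linalg.lstsq(flows, egomotions, rcond=None)
    return W

def estimate_egomotion(W, flow):
    """Apply the learned matched filters to new flow readings."""
    return flow @ W
```

Because the estimator is a single matrix product, it runs in constant time per frame regardless of how the weights were obtained, which is what makes such a linear read-out attractive for lightweight platforms.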
Journal: IEICE Transactions
Volume: 93-D, Issue: -
Pages: -
Publication year: 2010